Deformed Statistics Kullback-Leibler Divergence Minimization within a Scaled Bregman Framework

Authors

  • R. C. Venkatesan
  • A. Plastino
Abstract

The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics [constrained by the additive duality of generalized statistics (dual generalized K-Ld)] is here reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure-theoretic framework. Specifically, it is demonstrated that the dual generalized K-Ld is a scaled Bregman divergence. The Pythagorean theorem is derived from the minimum discrimination information principle using the dual generalized K-Ld as the measure of uncertainty, with constraints defined by normal averages. The minimization of the dual generalized K-Ld subject to normal-averages constraints is shown to exhibit distinctive features.
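The objects in the abstract can be illustrated concretely. A minimal sketch, assuming one common convention for the Tsallis q-logarithm, the generalized K-Ld built from it, and the additive duality q* = 2 - q that defines the dual generalized K-Ld; both forms recover the ordinary K-Ld in the limit q → 1:

```python
import numpy as np

def ln_q(x, q):
    """Tsallis q-logarithm: (x^(1-q) - 1)/(1 - q); reduces to log(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def generalized_kld(p, r, q):
    """One common form of the Tsallis generalized K-Ld:
    D_q(p || r) = sum_i p_i * ln_q(p_i / r_i)."""
    p, r = np.asarray(p, dtype=float), np.asarray(r, dtype=float)
    return float(np.sum(p * ln_q(p / r, q)))

def dual_generalized_kld(p, r, q):
    """Dual generalized K-Ld via the additive duality q* = 2 - q."""
    return generalized_kld(p, r, 2.0 - q)

p = np.array([0.6, 0.3, 0.1])
r = np.array([0.4, 0.4, 0.2])
# As q -> 1 both forms collapse to the ordinary Kullback-Leibler divergence.
kl = float(np.sum(p * np.log(p / r)))
```

The convention for the generalized K-Ld (and hence the sign/exponent placement) varies across the literature; this sketch is only meant to make the duality q* = 2 - q tangible, not to reproduce the paper's measure-theoretic construction.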


Related articles

Generalized Kullback-Leibler Divergence Minimization within a Scaled Bregman Framework

The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics subjected to the additive duality of generalized statistics (dual generalized K-Ld) is reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure theoretic framework. Specifically, it is demonstrated that the dual generalized K-Ld is a scaled Bregman divergence. The Pyth...


Scaled Bregman divergences in a Tsallis scenario

There exist two different versions of the Kullback-Leibler divergence (K-Ld) in Tsallis statistics, namely the usual generalized K-Ld and the generalized Bregman K-Ld. Problems have been encountered in trying to reconcile them. A condition for consistency between these two generalized K-Ld-forms by recourse to the additive duality of Tsallis statistics is derived. It is also shown that the usua...


Proximity Function Minimization Using Multiple Bregman Projections, with Applications to Split Feasibility and Kullback-Leibler Distance Minimization

Problems in signal detection and image recovery can sometimes be formulated as a convex feasibility problem (CFP) of finding a vector in the intersection of a finite family of closed convex sets. Algorithms for this purpose typically employ orthogonal or generalized projections onto the individual convex sets. The simultaneous multiprojection algorithm of Censor and Elfving for solving the CFP,...


Metrics Defined by Bregman Divergences †

Bregman divergences are generalizations of the well known Kullback Leibler divergence. They are based on convex functions and have recently received great attention. We present a class of “squared root metrics” based on Bregman divergences. They can be regarded as natural generalization of Euclidean distance. We provide necessary and sufficient conditions for a convex function so that the squar...
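The relationship this abstract starts from — that the Kullback-Leibler divergence is the Bregman divergence of a particular convex function — can be checked numerically. A minimal sketch, assuming the standard definition D_F(x, y) = F(x) - F(y) - <∇F(y), x - y> with F chosen as the negative entropy, which for probability vectors yields exactly the KL divergence:

```python
import numpy as np

def bregman(F, grad_F, x, y):
    """Bregman divergence of a convex function F:
    D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
    return F(x) - F(y) - float(np.dot(grad_F(y), x - y))

def neg_entropy(x):
    # Convex generator F(x) = sum_i x_i log x_i.
    return float(np.sum(x * np.log(x)))

def grad_neg_entropy(x):
    return np.log(x) + 1.0

p = np.array([0.6, 0.3, 0.1])
r = np.array([0.4, 0.4, 0.2])
# For probability vectors (sum(p - r) = 0) this equals KL(p || r).
d = bregman(neg_entropy, grad_neg_entropy, p, r)
```

Other choices of F give other familiar divergences, e.g. F(x) = ||x||^2 gives the squared Euclidean distance, which is the sense in which the "squared root metrics" of the abstract generalize Euclidean distance.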


An Abstract Weighting Framework for Clustering Algorithms

Recent works in unsupervised learning have emphasized the need to understand a new trend in algorithmic design, which is to influence the clustering via weights on the instance points. In this paper, we handle clustering as a constrained minimization of a Bregman divergence. Theoretical results show benefits resembling those of boosting algorithms, and bring new modified weighted versions of cl...



Journal:

Volume   Issue

Pages  -

Publication date: 2011